Input association for touch-sensitive surface
Patent abstract:
The present invention relates to a method of providing input at a touch-sensitive surface in a collaborative input system, which method comprises: selecting a displayed icon by touching the touch-sensitive surface; and providing further input associated with the selected icon. The present invention also relates to a method of collaborating at a touch-sensitive surface, a method of collaborating in a system comprising an interactive display surface with a touch screen, a collaborative input system comprising a touch-sensitive surface for receiving a plurality of touch inputs, and a collaborative input system comprising a touch-sensitive interactive display surface. Figure 1

Publication number: SE1150221A1
Application number: SE1150221
Filing date: 2011-03-14
Publication date: 2012-04-16
Inventor: Nigel Pearce
Applicant: Promethean Ltd
IPC main class:
Patent description:
surface other than by the use of distinctive form. When several touch inputs are provided with the same type of input, for example finger-touch input, there is no mechanism for distinguishing between such inputs at the interactive surface. In multi-user input scenarios, it can be beneficial for applications to be able to distinguish between inputs provided by different users.

An object of the present invention is therefore to provide an arrangement for a touch-sensitive surface that makes it possible to distinguish between inputs from different users. More generally, an object of the present invention is to provide an improved user interface for a collaborative input system that includes a touch-sensitive surface.

Summary of the invention

In one aspect, the invention provides a method of providing input at a touch-sensitive surface of a collaborative input system, which method comprises the following: selecting a displayed icon by touching the touch-sensitive surface; and providing further input at the touch-sensitive surface, this further input being associated with the selected icon.

This further input may be associated with the selected icon until another icon is selected. This further input may be associated with the selected icon until the selected icon is deleted.

The method may further comprise, on selection of the displayed icon, a step in which the displayed icon is associated with a user. The method may further comprise, on selection of the displayed icon, a user registration process to identify the user.

The step of selecting a displayed icon may involve selecting and dragging the icon to an edge of the touch-sensitive surface. In response to detecting that an icon is being dragged to the edge of the touch-sensitive surface, the icon may be selected. On selection, a selected icon may be aligned relative to an edge of the touch-sensitive surface.
The selected icon may be aligned with the edge to which it is dragged. On selection, the icon may be placed at a preset distance from an edge of the touch-sensitive surface.

On selection of the displayed icon, a number of identification options may be presented for choice. The options may include a number of avatars or a number of images. The selection of another displayed icon may confirm a user's choice among the options. When a number of icons are selected by a number of users, a user selection process may be maintained until all users have confirmed an option for each of the plurality of selected icons.

The method may further include repositioning the displayed and selected icon on the touch-sensitive surface.

A number of users may select a number of displayed icons. Two or more of the plurality of displayed icons may be grouped. The grouping may be denoted by a displayed identifier on the grouped user icons. The grouping of user icons may be determined by a user who selects a group identifier for their displayed icon.

A selected user icon may be deleted by dragging the displayed icon away from the edge of the touch-sensitive surface.

The method may further comprise the step of defining at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more users. The method may further comprise the step of defining at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more applications. The method may further comprise the step of defining several active areas on the touch-sensitive surface, the active areas being defined by the selection of user icons.

Each selected icon may be associated with a menu. The menu may be displayed when the icon is selected.

The method may further comprise the step of determining a position of a user associated with the displayed icon. The user's position may be determined based on the position of the displayed icon.
The method may further comprise providing a plurality of displayed icons, wherein each displayed icon is intended for association with a particular identity. The identity may be a user identity.

The method may further comprise selecting an option that is presented after the selection of the displayed icon, and associating the selected option with this icon. The option may be a tool selected from a tool menu, the tool being displayed on the touch-sensitive surface, all input resulting from the use of the tool being associated with the displayed icon. The displayed icon may represent a database.

The method of collaborating at a touch-sensitive surface may further comprise the following: locating the positions of a plurality of users in relation to the touch-sensitive surface depending on the locations of a corresponding plurality of displayed icons associated with the plurality of users; and associating input at the touch-sensitive surface with a specific user depending on the location of the input being adjacent to the located position of that user.

Input may be determined to be adjacent to a located position of a user if it is within a certain area around the located position of the user. The step of locating positions for multiple users may include determining the positions of the users in relation to the edges of the touch-sensitive surface. The method may further comprise the step of determining the number of users. The method may further include the step of tracking input from each user. The method may further comprise providing information on the contribution made by each user. A system event may be determined by a collective response from all users.

The method may further comprise associating with the displayed icon any function that has the same location as it. A function may be associated with the displayed icon if another displayed icon representing this function is positioned so that it coincides with the displayed icon.
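The proximity rule described above can be sketched as follows. This is an illustrative reading of the claim, not a prescribed implementation: the function name, the dictionary of user positions, and the fixed-radius test are all assumptions; the patent only requires that input "within a certain area" around a located position be attributed to that user.

```python
import math

def attribute_input(touch_xy, user_positions, radius):
    """Allocate a touch to the nearest located user within `radius`, else to no one.

    user_positions maps a user id to the (x, y) position derived from that
    user's displayed icon; touch_xy is the coordinate of the detected input.
    """
    best_id, best_d = None, radius
    for user_id, pos in user_positions.items():
        d = math.dist(touch_xy, pos)
        if d <= best_d:
            best_id, best_d = user_id, d
    return best_id
```

A touch that falls outside every user's area returns no association, which an application could treat as anonymous input.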
The method may further comprise providing a plurality of displayed icons, each associated with one of a number of users, two or more displayed icons being associated with a grouping. A common grouping for two or more displayed icons may be denoted by a common element of the displayed icons. One or more predetermined groups may be provided, and the method may further include associating a displayed icon with a grouping such that the displayed icon is associated with a predetermined group. Input associated with a displayed icon associated with a grouping may be tracked for determining an individual's contribution to a group.

The method may further comprise providing a plurality of displayed icons, each for association with one of a number of users, wherein a step of an application running on a computer system in connection with the touch-sensitive surface is dependent on input associated with each of the plurality of icons. The step of the application may be dependent on each of the plurality of icons being associated with a user.

The displayed icon may be associated with a user, a current position of the displayed icon representing the current position of the user in relation to the touch-sensitive surface. The user's position may be determined depending on the displayed icon being located at or near an edge of the touch-sensitive surface.

The method may further comprise a plurality of displayed icons, each associated with one of a plurality of users, the display surface of the touch screen being formed of at least one working area which is a subdivision of the whole surface, input detected within the working area being associated with a user depending on the displayed icon associated with this user being located within the working area.

The method may further comprise selecting a tool or object associated with the displayed icon, all interactions with the tool or object being associated with the displayed icon.
The displayed icon may be associated with a user, and the interaction with the tool or object is associated with the user. The tool or object may be selected by an association with, or a selection of, the displayed icon.

In one aspect, the invention provides a method of collaborating at a touch-sensitive surface, which includes the following: determining the positions, relative to the surface, of a plurality of users; and allocating input to one or more users, depending on the determined positions, on the basis of the location of this input in relation to the determined positions.

The method may further include tracking the contribution of each user. The method may further include tracking the contribution of each user within a collaborative task. The position of the user may be determined depending on the position of a displayed icon associated with the user. The displayed icon may be placed by the user. The displayed icon may be placed along an edge of the touch-sensitive surface.

In one aspect, the invention provides a method of collaborating at a touch-sensitive surface, which includes the following: defining an area associated with a user; and associating all information presented in this area with that user. The area may be defined by a physical area of the touch-sensitive surface. The area may be defined by a graphic icon displayed on the touch-sensitive surface. The association step may involve selecting displayed content and dragging it into the defined area.

In one aspect, the invention provides a method of collaborating at a touch-sensitive surface, which includes the following: associating each of a plurality of users with a group; and tracking group input. The step of associating each of a number of users with a group may include each user joining a group. The method may further comprise providing a number of user groups.
The group may be defined by an application, and the association of a user with a group involves the user joining the group.

In one aspect, the invention provides a method of collaborating at a touch-sensitive interactive display surface, which includes displaying a plurality of user interface elements, each associated with one of a number of users, a step of an application running on a computer system associated with the touch-sensitive interactive display depending on a selection made by each user at each user interface element.

The operation of an application running on a computer system in connection with the touch-sensitive interactive display may depend on the choice made by each user at each user interface element being the same choice. The choice made by each user may include a choice of a user identity.

In one aspect, the invention provides a method of tracking the position of a user relative to the edge of a surface of a touch-sensitive interactive display area, which includes providing, for each user, a user icon that is displayed on the display surface, the current position of the displayed user icon representing the current position of the user. Movement of the user icon of a user may represent movement of the user's position. Inputs detected within an area near the user icon may be associated with the user associated with the user icon.

In one aspect, the invention provides a method of collaboration in a system comprising an interactive display surface with a touch screen, which includes providing a plurality of user icons representing a respective plurality of users; and dividing the interactive display surface with a touch screen into a plurality of work areas, each work area being arranged to receive input from one or more specified users, the users being specified by the location of their associated user icons in relation to the work area.

The work areas may be defined by proximity to parts of the edges of the interactive display area.
The invention provides a computer program for carrying out any of the method steps described above. The invention provides a computer software product for storing computer software code for performing any of the method steps described above. According to additional aspects, the invention provides a collaborative input system adapted to perform any of the method steps described above.

Brief description of the drawings

The invention will now be described by way of example with reference to the attached figures, to which the following applies:

Figure 1 illustrates the selection of a user symbol according to an embodiment of the invention.
Figure 2 illustrates a process for the selection of a user symbol according to an embodiment of the invention.
Figure 3 illustrates the alignment of a user symbol according to an embodiment of the invention.
Figure 4 illustrates the registration of a user symbol according to an embodiment of the invention.
Figure 5 illustrates the association of a user symbol according to an embodiment of the invention.
Figure 6 illustrates tracking of user input according to an embodiment of the invention.
Figure 7 illustrates a process for tracking user input according to an embodiment of the invention.
Figure 8 illustrates the selection of user symbols according to embodiments of the invention.
Figure 9 illustrates user registration of symbols within an initialization or registration process according to an embodiment of the invention.
Figure 10 illustrates the association of tools with user symbols according to embodiments of the invention.
Figure 11 illustrates the association of a game with user symbols according to an embodiment of the invention.
Figure 12 illustrates the association of images for editing with user symbols according to an embodiment of the invention.

Description of preferred embodiments

The invention will now be described by means of references to various examples, embodiments and advantageous applications.
Those skilled in the art will appreciate that the invention is not limited to the details of any described examples or embodiments. In particular, the invention may be described with reference to, for example, interactive display systems; those skilled in the art will appreciate that the principles of the invention are not limited to any such described systems.

The invention is described here with reference to a touch-sensitive interactive display surface for collaboration. The invention is particularly described in connection with such a surface provided as a horizontal surface or table surface, but is not limited to one specific user arrangement.

The invention is not limited to any particular type of touch-sensitive technology, nor to any particular type of display technology. In some examples, the display of the touch-sensitive surface is provided by a projector projecting images onto the touch-sensitive surface. In other examples, the display may be obtained by the touch-sensitive surface being an emitting surface. There are various other possibilities, as those skilled in the art will appreciate. In general, the surface 100 is described herein as a touch-sensitive surface, on which images may be projected (e.g. with a projector) or which may also be an emitting display surface.

An arrangement according to the invention provides a method for providing input at a touch-sensitive surface of a collaborative input system, which includes selecting a displayed icon by touching the touch-sensitive surface, and providing further input by touching the touch-sensitive surface, wherein this further input is associated with the selected icon. A collaborative input system is, as is known, a system within which several users provide input to the system. The collaborative input system provides a workspace for multiple users.
The operation of a first example of an arrangement according to the invention is described with reference to Figures 1 and 2.

Referring to Figure 1(a), reference numeral 100 generally illustrates a touch-sensitive interactive display area. The surface 100 constitutes a touch-sensitive area with which a display is associated. As shown in the example in Figure 1(a), four graphical user interface icons (GUI icons), or icons, are displayed on the display of the touch-sensitive surface 100. These icons are denoted by the reference numerals 102a to 102d. Within a preferred embodiment, the icons may also be called symbols; the term symbols is used within this description. As shown in Figure 2, step 1002 denotes the display of a number of symbols.

The symbols may be displayed anywhere on the display, and the arrangement in Figure 1(a) is illustrative only. In general, two or more symbols may be displayed on the touch-sensitive interactive display area. The icons 102a to 102d displayed on the touch screen represent symbols that can be selected and assigned an association, as further described in the following. For example, a symbol may be associated with a user.

For the purpose of describing embodiments, it is assumed that the touch-sensitive surface 100 is positioned as a horizontal surface and forms an interactive surface on a table surface. Users can thus stand around the surface and be located at any of the four edges of the surface. Within other embodiments, the surface may have a shape other than the rectangular shape shown in the figures.

As shown in the example in Figure 1(b), two symbols are selected. A user may select a symbol 102c using their hand 104c, selecting the symbol with a finger. Likewise, a user may select a symbol 102b using their hand 104b, the symbol being selected with one finger.

The selection of symbols is illustrated in Figure 2 with step 1004. If no symbol is selected in step 1004, the symbols continue to be displayed.
If a symbol is selected, within a first arrangement an association is made in steps 1006 and 1008.

In a step 1006, association options for the selected symbol are presented or displayed. The association options may vary depending on the application or the arrangement. In general, the association causes an identity to be associated with a symbol. Examples of association options are also given in the description in the following. For the purpose of this example, an association is established between a symbol and a user.

The options shown in step 1006 in this example are thus options for user identification. The options may be displayed as a list of registered users or as an option to create a user. The option to create a user may include the ability to create a user account to formally identify a user. The option to create a user may include the ability to create a user identity, for example by selecting an avatar, so that a user identity is created without verification of the user identity, in practice a unique anonymous user.

In step 1008, the user selects the appropriate association, in this case a user identity. Then, within a step 1010, the selected association is stored with the symbol.

After determining the selection of a symbol within step 1004, within a second arrangement, instead of proceeding directly to step 1006, the process continues with step 1005. The step of presenting the association options in step 1006 is enabled only after a symbol has been selected and moved, in particular after the symbol has been determined to have moved to an area of the touch screen that triggers a selection for association. This optional step 1005, which is performed between steps 1004 and 1006, is now described in more detail with reference to Figures 1(b) and 1(c).
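The flow of Figure 2 can be sketched as a short sequence. This is a reading aid, not the patented implementation: the function name, the trace of step labels, and the dictionary used to store the association are all illustrative assumptions; only the step numbers come from the description above.

```python
def run_registration(symbol_id, chosen_option,
                     require_edge_drop=False, dropped_at_edge=True):
    """Walk one symbol through the Figure 2 flow: display (1002), select
    (1004), optionally check the edge-drop condition (1005), present options
    (1006), choose (1008), and store the association (1010)."""
    trace = ["1002_display", "1004_select"]
    if require_edge_drop:                      # second arrangement
        trace.append("1005_check_edge")
        if not dropped_at_edge:
            return trace, None                 # not a selection for association
    trace += ["1006_present_options", "1008_choose", "1010_store"]
    return trace, {symbol_id: chosen_option}
```

In the first arrangement the edge check is skipped entirely; in the second, a symbol released away from the edge never reaches step 1006.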
In this arrangement, it is determined within step 1004 whether a symbol has been selected by means of a contact, and within step 1005 it is then determined whether this contact, or this initial selection, is a selection for the purpose of defining an association.

As shown by arrows 106c and 106b in Figure 1(b), the hand 104c in contact with the symbol 102c is moved by the user in the direction of the arrow 106c toward a side or edge of the touch-sensitive surface 100. Likewise, the hand 104b in contact with the symbol 102b is moved by the user in the direction of the arrow 106b toward another side or edge of the touch-sensitive surface 100. In order for any of the symbols 102 to move, the contact is maintained while the hand is moved toward the edge of the touch-sensitive surface 100, i.e. the finger must remain in contact with the surface of the display.

As shown in Figure 1(c), the symbol 102c is moved to one edge of the touch-sensitive surface by the hand 104c, and the symbol 102b is moved to another edge of the touch-sensitive surface 100 by the hand 104b.

According to the optional step 1005, the detection that a symbol is moved and placed at a location next to an edge of the touch-sensitive surface triggers the selection of this symbol and triggers the steps for defining an association. The association options are then presented within step 1006. Steps 1008 and 1010 then follow as described previously.

Within this exemplary arrangement, as has been discussed with reference to step 1005, a symbol may be considered to have been selected for allocating an association, for example with a user, only when it has been moved to a location within a certain distance from an edge of the touch-sensitive surface 100.
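The edge test of step 1005 can be sketched as a simple geometric check. The surface dimensions, the margin value, and both function names are illustrative assumptions; the patent specifies only "a location within a certain distance from an edge".

```python
def in_border_region(x, y, width, height, margin):
    """True if (x, y) lies within `margin` of any edge of a width x height surface."""
    return (x <= margin or y <= margin
            or x >= width - margin or y >= height - margin)

def on_symbol_released(x, y, width=1920, height=1080, margin=60):
    """Step 1005: only an edge drop triggers the association steps (1006 onward)."""
    if in_border_region(x, y, width, height, margin):
        return "present_association_options"
    return "ignore"
```

A drop in the middle of the surface is simply ignored for association purposes, matching the behaviour described above.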
Within one arrangement, a border may be displayed along the edge of the display in the touch-sensitive surface 100, and when a symbol 102 is moved to a location within the displayed border, it may be considered to have been reserved or selected for an association, e.g. with a user. Such a border may be displayed temporarily during an initial initialization or registration process. The selection may be triggered by the detection that the contact is released, whereby the symbol is dropped, when the symbol is within a certain location or within a certain distance from an edge of the display. The location may require that the displayed symbol abuts the edge of the display area.

Thus, within the exemplary arrangement described above, the fact that a symbol is dragged to an edge of the display surface denotes a selection of this symbol for the purpose of defining an association.

Figures 1(a) to 1(c) illustrate an example of an initial step, such as an initialization process, a registration process or a planning process, for an application with a touch-sensitive surface, with users selecting symbols to define a user association. Within one example of an arrangement, a user association may be defined simply after selecting a displayed symbol by touch. In another example of an arrangement, a user association may only be defined after the symbol has been selected by touch and then placed within a certain area of the display surface. Other types of association may be defined in a similar way, independently of or in combination with an association of user identity. Two users have selected the symbols 102c and 102b, respectively.

On selection, a symbol may be aligned in a specific arrangement. For example, within the above example the symbol may be aligned with respect to the edge of the display toward which it is dragged. The alignment of a symbol includes aligning it in such a way that information displayed in or together with the symbol is displayed optimally.
Text, such as a username or alias, or an image, such as an avatar or a photograph, is optimally placed by the symbol being aligned with the edge of the display. This preferably involves aligning the symbol so that the content is aligned with an edge, for example so that a user standing at that edge can see it optimally. The alignment preferably takes place after the symbol has been placed within the edge region, and/or before or after association.

Figure 2 indicates this optional alignment in a step 1012, after the association has been completed, as this provides the information to be aligned. The alignment may, however, take place as soon as the selection of the symbol is detected, such as when a symbol has been detected as having been dragged to a position at or near the edge of the display surface and then released.

Figure 3 shows an example of alignment. As shown in Figure 3(a), the symbol 102c begins within an area at or near an edge 1024 of the touch-sensitive display area, within a border region nominally defined by an area between the display edge 1024 and a line 1022 across the surface of the display, parallel to the edge of the display. Figure 3(a) also shows the hand 104c with which the symbol 102c is dragged into the border area.

Figure 3(a) also shows the contents 1020 of the symbol 102c, as display information which appears inside the symbol 102c. In most scenarios, the association for the symbol is not expected to be defined yet. The symbol may simply have a label, such as "Symbol". Alternatively, the symbol may lack information, but it is still aligned in such a way that when information is displayed, it is correctly aligned. As shown in Figure 3(a), this content 1020 is, at the initial position, not correctly aligned relative to the edge 1024. The alignment at the initial placement may be arbitrary.
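One way to realise this alignment is to find the edge nearest the symbol and rotate the content so it reads upright for a viewer standing at that edge. The function names and the specific rotation angles are illustrative assumptions; the patent only requires that the content be aligned with the edge toward which the symbol is dragged.

```python
def nearest_edge(x, y, width, height):
    """Return the edge ('left', 'right', 'top', 'bottom') closest to (x, y)."""
    distances = {
        "left": x,
        "right": width - x,
        "top": y,
        "bottom": height - y,
    }
    return min(distances, key=distances.get)

def alignment_angle(edge):
    """Rotation in degrees so content reads upright for a viewer at `edge`
    (0 degrees assumed to mean upright for a viewer at the bottom edge)."""
    return {"bottom": 0, "right": 90, "top": 180, "left": 270}[edge]
```

Applying `alignment_angle(nearest_edge(...))` when the touch contact is released reproduces the behaviour shown in Figure 3(b).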
After removal of the touch contact of the hand 104c on the symbol 102c, as shown in Figure 3(b), the symbol 102c is aligned so that the content 1020, or the location where content can be displayed, is in line with the edge 1024. Within the preferred arrangement, the alignment is based on the assumption that a user viewing the content is at the edge adjacent to which the symbol is placed. In other arrangements, another alignment may be optimal or preferred.

Furthermore, within an arrangement, when a symbol has been dragged to the edge of the display and optionally aligned into a preferred position, it may, once selected, be further placed at a predetermined distance from an edge of the display. A fixed distance may be set between the surface edge and the edge of the symbol closest to it.

The mode of operation illustrated with reference to Figures 1(a) to 1(c) and Figure 2 may be an automatic mode of operation, started during the initialization of a system or the startup of an application, or it may be a mode selected from a menu by a user. It should be appreciated that additional images or graphic elements may be displayed on the touch-sensitive surface while the steps described in Figures 1(a) to 1(c) and Figure 2 are carried out with a mode of operation involving selection of symbols.

Referring to Figures 4(a) to 4(d), another example of an arrangement is described, according to which the association of a symbol with a user is defined.

As shown in Figure 4(a), the symbols 102c and 102b have each been placed against the edge of the touch screen 100. As shown in Figure 4(b), the user with the hand 104c selects the symbol 102c, so that the symbol 102c is preferably highlighted. The symbol 102c may at this point also be considered active.

In this described arrangement, the symbol 102c is not at this stage associated with a specific user.
Admittedly, a user has selected the symbol and dragged it to the edge of the touch-sensitive surface, but no relationship has been defined between the symbol and a specific user.

According to this mode of operation, after the symbol 102c has been selected with the hand 104c, the symbol 102c is highlighted, and the display on the touch screen is further adjusted to display a virtual keyboard 112 and a display area 110. The virtual keyboard 112 and the display area 110 can then be used by the user who selected the symbol 102c to enter additional details to identify himself. The display 110 may show fields that allow a user to log in to a registration system, or which allow a user to identify himself in another way by entering a name or other identifying information.

After a suitable registration or other procedure, the symbol 102c is modified, as shown in Figure 4(d), to include a user identification, which is denoted by the reference numeral 112c. The user identity may be the identity of a user known by a registration system, or a temporary identification provided by a user to customize the symbol 102c.

This user identification process does not require the symbol to remain at the edge of the display. This association of the user may take place regardless of the location of the symbol. The selection of a symbol in any arrangement does not necessarily require that the symbol be dragged to the edge of the display, nor that it be held in a fixed position at the edge of the display. For example, a user can simply touch a symbol and then make one or more additional inputs by touch, and this or these additional touch inputs are then associated with this symbol.

In the above example, the process of providing a user identity is shown as a distinct step, in which a user selects the symbol with a touch after the symbol has been placed within a symbol selection area.
Within alternative and preferred arrangements, the prompt for a user identity to be associated with the symbol - or, in some embodiments, some other type of identity - may come automatically when the symbol has been selected for association. Thus, in the example of Figure 4, the keyboard 112 and the display area may be shown immediately when the user removes the touch contact with the symbol after dragging it to the edge of the display area.

Other methods of providing a user association may be provided. Figure 5 illustrates an arrangement where a set 152 of user identification options 150a to 150d is displayed after selection of a symbol for user association. The options 150a to 150d may identify registered users of the system, e.g. by name or with photographs, so that a user can choose their own identity. The selection of a user identity in this way may require confirmation by entering a password. The options 150a to 150d may identify avatars, so that a user can select an anonymous identity. The number of user identification options displayed may represent all options or only a small number of the available options, whereby additional options are shown by scrolling left or right in the displayed set 152.

Several identification options can thus be provided for selection. These options may include choosing from a number of avatars or a number of images, etc. These options may allow a user to be identified anonymously.

The selection of another displayed icon after the selection of a symbol, for example the selection of an identity of the user, such as the selection of one of the options 150a to 150d in Figure 5, can thus confirm the user's choice among the user identification options.

The foregoing describes examples where a selected symbol is associated with a user. However, the defined association does not have to be a user association. Another type of association can be attributed independently of, or in addition to, a user identifier.
For example, a symbol may be associated with a group identity rather than with a user identity. With further reference to Figure 5, a set 152 of group identification options 150a to 150d is shown. The group identification options 150a to 150d may each identify predefined groups, and the user may select a predefined group from the options to join this group. The groups may simply be displayed as a color choice or with some other form of group identification. In this way, the user may choose to associate a symbol with a group, which is an example of providing an association for a symbol that is not one with a specific user.

Thus, selected icons or symbols can be grouped regardless of whether they are associated with a user. The association of a symbol with a group may be denoted by displaying an identifier on the symbol. For example, a symbol may be displayed in a certain color, or at least in part with a certain color, to denote association with a group. All symbols associated with a common group may be displayed with the same color or other appropriate indication.

The grouping may be determined by a user who selects a group identifier for their displayed icon, as mentioned above with reference to Figure 5.

A user selection or initialization process may not be completed until options for each of a plurality of selected icons, such as user identity options, have been confirmed. For example, the user registration process may last until a user association has been defined for each selected symbol. In general, there may be a requirement that an appropriate association (with a user or otherwise) be defined for each symbol before an initialization or registration process is completed. The initialization or registration process may be associated with a specific application, with the application progressing only when the registration or initialization has been completed.
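The completion requirement described above amounts to a simple gate: the application progresses only when every selected symbol has a confirmed association. The function name and the dictionary-based bookkeeping below are illustrative assumptions.

```python
def registration_complete(associations):
    """True once every selected symbol has a confirmed association.

    `associations` maps each selected symbol id to its confirmed identity
    (a user, avatar, or group), or to None while still unconfirmed. An
    empty mapping means no symbols were selected, so nothing is complete.
    """
    return bool(associations) and all(v is not None for v in associations.values())
```

An application would poll or be notified of this condition at the end of its initialization or registration phase before continuing.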
A specific example in which an application may continue only when all selected symbols have been associated with a user is further described in an example below with reference to Figure 9.

Referring to Figures 6(a) to 6(d) and Figure 7, the use of selection and user association of symbols as described above, for identifying touch input provided by a particular user after a symbol has been associated with that user, is described within an example of an arrangement. In the arrangement described, it is assumed that a symbol has been dragged to an edge to be selected for an association, and that after this selection a user association has been selected for the symbol, in accordance with the arrangements described above. Various examples of arrangements are described here in connection with such a selection and association process for a symbol, but it will be appreciated that alternative procedures for selecting a symbol can be used.

As shown in Figure 6(a), one user has selected a symbol 102c and another user has selected a symbol 102b, the selection being obtained by dragging the respective symbols to edges of the surface, and the symbols have been associated with respective user identities. The initial step has thus been completed and user identities have been attributed. As shown in step 1030 in Figure 7, the symbols are displayed on the surface.

As shown in Figure 6(b), a user uses his hand 104c to select the symbol 102c. This selection is detected within a step 1032.

As shown in Figure 6(c), the symbol 102c may after this selection be highlighted, or its appearance changed in some way, to show that it has been selected. Alternatively, no change or highlighting of the symbol 102c is made. Preferably, however, the displayed icon for the symbol 102c is modified to indicate that this symbol has now been selected, which can be beneficial.
After selecting the symbol 102c in step 1032, the user can, with the hand 104c as illustrated in Figure 6(d), then perform touch operations on the surface 100. As shown in Figure 6(d), the point of contact of the finger of the user's hand 104c with the touch-sensitive surface 100 is associated with a marker denoted by the reference numeral 108, which can be moved on the surface by moving the finger.

As those skilled in the art will appreciate, the touch input provided by the user with the hand 104c can generally be used for selecting displayed images, manipulating displayed images, providing input such as notes and drawings, etc. The user's finger contact can generally be used to interact with applications, open and close applications, select and manipulate tools, etc. As shown in step 1034 in Figure 7, this input is detected and processed by the system in the usual, known manner.

In accordance with this preferred arrangement, all touch input detected by the touch-sensitive surface 100 after selection of the symbol 102c is associated with the symbol 102c, as shown in step 1036 of Figure 7. All touch input is associated with the symbol 102c until the symbol 102c is deselected within a further step, or until it is replaced by the selection of another symbol. A selected symbol can be deselected simply by touching or selecting it again after an initial selection.

In step 1038 in Figure 7, it is determined whether a new symbol has been selected. If so, steps 1034 and 1036 are repeated, but with input associated with the newly selected symbol. If no new symbol has been selected, it is determined within a step 1040 whether the current symbol has been deselected. If the symbol has not been deselected, steps 1034 and 1036 are repeated with input still associated with the current symbol. If the current symbol has been deselected, the procedure returns to step 1030 to await the selection of a new symbol in step 1032.
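The selection and association loop of steps 1030 to 1040 in Figure 7 can be sketched in code. The following is an illustrative Python sketch, not an implementation from the patent; the class and method names are assumptions. Re-touching the currently selected symbol deselects it, selecting a different symbol replaces it, and each touch event is tagged with whichever symbol is selected at the time.

```python
class SymbolInputTracker:
    """Illustrative sketch: associate touch input with the currently
    selected symbol, following steps 1030-1040 of Figure 7."""

    def __init__(self):
        self.selected = None   # currently selected symbol id, if any
        self.log = []          # recorded (symbol_id, input_event) pairs

    def select(self, symbol_id):
        # Step 1032 / 1038 / 1040: touching the current symbol again
        # deselects it; touching another symbol replaces the selection.
        if self.selected == symbol_id:
            self.selected = None
        else:
            self.selected = symbol_id

    def touch(self, event):
        # Steps 1034 / 1036: record the input tagged with the selected
        # symbol; in this sketch, input with no selected symbol is ignored.
        if self.selected is not None:
            self.log.append((self.selected, event))
```

For example, after `select("102c")` every `touch(...)` is logged against symbol 102c until another symbol is selected or 102c is touched again.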
Referring to Figures 8(a) and 8(b), the process by which input can be associated with another user is illustrated. As shown in Figure 8(a), the symbol 102c is highlighted, indicating that this symbol is selected. A finger of another user's hand 104b then touches the symbol 102b. As shown in Figure 8(b), the symbol 102c is then no longer highlighted, while the symbol 102b is highlighted. All touch input detected after the selection of the symbol 102b is then associated with the symbol 102b. This is represented in the flow process of Figure 7 by step 1038.

Touch input that is detected when no symbol has been selected can in certain arrangements be ignored. This means that touch input can only be processed if it can be associated with a symbol. In other arrangements, touch input can be processed even if no symbol is selected, and such input is then not associated with any symbol.

As discussed further below, each selected symbol can be associated with a menu. The menu can be displayed by selecting the symbol. The menu can, for example, enable the user to join a new group or otherwise change the settings that are associated with the symbol.

When an application or operating system is running, user interface tools can likewise be allocated to a user if they are derived from a symbol associated with that user. For example, if a toolbox is displayed on selection of a user symbol, and a specific tool is selected from this toolbox, all use of this tool, and all input associated with this tool, is associated with this symbol and thus with this user. One example is when a user selects a keyboard from a menu and a virtual keyboard then appears somewhere on the surface. All input made with the virtual keyboard is then associated with the user. An example is given above with reference to Figure 4, where such a keyboard is associated with a symbol during user registration, but more generally such an association can be made during any use of an application.
A symbol does not have to be associated with a user. A symbol provides an anchor point with which input can be associated, and it may, but need not, be tied to or associated with a specific user. Within one option, a symbol may for example, instead of being associated with a user, be associated with a database. If the symbol is selected, all input subsequently detected is then registered in the database.

In another arrangement, a procedure is provided for cooperation at a touch-sensitive surface, which includes determining the position relative to the surface of a plurality of users, and allocating input to one or more users depending on the determined positions, based on the location of this input in relation to the determined positions. The position of each of the plurality of users in relation to the surface can be determined on the basis of the location of a symbol associated with the user. With reference to Figures 8(a) and 8(b), for example, it may be assumed that a user associated with the symbol 102b is adjacent to the edge of the surface where the symbol 102b is located, and likewise that the user associated with the symbol 102c is adjacent to the edge of the surface where the symbol 102c is located. All input detected at the surface within a region adjacent to either of the symbols 102b and 102c is therefore associated with the respective user. Thus, an area can be defined around or together with a symbol, and all input associated with that area is assigned to the user associated with this symbol. Input can thus be determined to be close to a localised position of a user if it is within a certain area around the located position of the user, the position of the user preferably being determined by the position of the symbol associated with that user.

As can be seen from the description above, a plurality of symbols can be associated with a corresponding plurality of users in such a way that input from a plurality of users can be tracked.
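The proximity-based allocation described above can be sketched as a simple nearest-symbol lookup. This is an illustrative sketch only; the function name, the coordinate representation and the fixed radius are assumptions, since the patent leaves the size and shape of the region around each symbol open.

```python
import math

def allocate_input(point, symbol_positions, radius):
    """Illustrative sketch: assign a touch point to the user whose
    symbol lies nearest to it, provided the symbol is within `radius`.
    Returns the user id, or None if no symbol is close enough."""
    best_user, best_dist = None, radius
    for user, pos in symbol_positions.items():
        d = math.dist(point, pos)        # Euclidean distance to symbol
        if d <= best_dist:
            best_user, best_dist = user, d
    return best_user
```

A touch far from every symbol returns `None`, matching the arrangement in which input that cannot be associated with a symbol is ignored.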
In a collaborative work environment, the system can thus track all input provided and also track the users who provide this input. In this way the contribution of each user to a collaborative task can be determined.

As stated above, the determination of the user from whom input is received can be made in a number of ways. It can be determined on the basis of the symbol that was selected before this input was made. It can be based on the area of the work surface within which this input was made. As described above, the area can be determined on the basis of a determination of the location of the user. Alternatively, and as will be discussed further in the following, there may be areas on the work surface that are defined or reserved for use by certain users, so that all input within those areas is determined to originate from a particular user.

As stated above, a symbol associated with a user may be moved around the work surface to be displayed in different positions, by selecting the symbol through touch and dragging it around the work surface. In this way, a user can move the symbol with which he or she is associated when he or she changes position around the table, in order, for example, to dynamically adjust the work surface area on the surface with which he or she is associated.

The work surface area of the touch-sensitive display can be divided according to the number of users and the associated number of symbols. If, for example, two symbols have been selected and associated with users, and the symbols are located at opposite edges of the work surface, the work surface can be divided into halves along the middle, each half being associated with a respective user, so that all input in each half is associated with that respective user.
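The division of the work surface into equal parts per user can be sketched as below. This is a hedged illustration under the assumption of equal vertical strips ordered left to right; the patent's example of two halves along the middle is the two-user case.

```python
def assign_by_strip(point, width, users):
    """Illustrative sketch: divide a surface of the given width into
    len(users) equal vertical strips, ordered left to right, and return
    the user owning the strip containing `point` (an (x, y) pair)."""
    strip_width = width / len(users)
    strip = int(point[0] // strip_width)
    strip = min(strip, len(users) - 1)   # clamp input on the right edge
    return users[strip]
```

With two users and `width=10`, points left of the centre line go to the first user and points right of it to the second, as in the halves example above.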
Thus, an example of an arrangement includes a method for collaboration at a touch-sensitive surface, including locating a position for each of a plurality of users in relation to the touch-sensitive surface depending on the location of a corresponding plurality of displayed symbols associated with the plurality of users, and associating input at the touch-sensitive surface with a specific user depending on whether the location of the input is next to the localised position of the user. Within this arrangement, the method may include determining the position of each user.

In another arrangement, a GUI icon that comprises a symbol can be formatted and displayed in such a way that other GUI icons can be superimposed on it. Such a symbol can be considered a container, in which additional GUI icons can be placed or included. All GUI icons placed or included in the symbol are associated with this symbol, and thus with any identifiers associated with this symbol, such as a user identifier.

In an alternative to this arrangement, the container associated with the symbol is defined as a specific area of the work surface, associated with the symbol, instead of as a displayed icon. Within the example described above, where work areas on the touch screen are reserved for certain users on the basis of the position of a user's symbol, this work area can be defined as a container, any GUI icons or objects placed in the container being associated with a certain user. All GUI icons that are in, or are dragged to, the specific area or into the container are then defined as having a relationship with this user.

Thus, within an arrangement, a procedure for cooperation at a touch-sensitive surface is provided, which includes the definition of an area associated with a user, and the association of all information displayed in this area with this user.
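The container arrangement, in which any GUI icon placed inside a symbol's area inherits that symbol's association, can be sketched as a point-in-rectangle ownership test. The function name and the rectangle representation are illustrative assumptions.

```python
def owner_of(icon_pos, containers):
    """Illustrative sketch: return the symbol whose container rectangle
    contains icon_pos, or None if the icon is outside every container.
    `containers` maps a symbol id to a rectangle (x0, y0, x1, y1)."""
    x, y = icon_pos
    for symbol, (x0, y0, x1, y1) in containers.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return symbol            # icon inherits this symbol's identifiers
    return None
```

Dragging an icon into a container would then simply mean its new position falls inside that container's rectangle, so `owner_of` reports the associated symbol and, through it, the user.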
As mentioned above, the area can be defined as a physical area of the touch-sensitive surface, or it can be defined by a graphical icon displayed on the touch-sensitive surface regardless of physical area or position.

In some arrangements, the selection of a GUI icon displayed inside the container may lead to the display of a menu or tool associated with this GUI icon, whereby this menu or tool can be displayed within an area outside the container, i.e. outside the user's reserved work area or outside the symbol's GUI. This displayed icon is hereby associated with the user, and all input detected at this displayed icon, even if it is outside the area of the user's symbol or the user's reserved work area, is associated with the user. Thus, any "widgets" generated as a result of interaction with the user's container come to be automatically associated with the user via his or her symbol.

This principle applies to all objects or tools that are generated or displayed following a selection based on a symbol, for example from a symbol menu. All objects or tools so selected are associated with this symbol, regardless of their position on the work surface.

Within yet another example of implementation, a procedure is provided for collaboration in a system that includes an interactive touch-screen display, which includes providing a plurality of user icons representing a corresponding plurality of users, and dividing the interactive touch-screen display into a plurality of work areas, each work area being arranged to receive input from one or more specified users, the users being specified using the location of their associated user icon in relation to the work area. Thus, as described above, work areas can be defined on the interactive display surface, each representing a sub-area of the whole display area. The sub-areas can then be associated with a specific user through the area being associated with a symbol with which this user is associated.
These areas can be dynamic, depending on the current location of a symbol associated with a user. Alternatively, these areas may be fixed, depending on the number of users working in the system and the corresponding associated number of symbols.

For example, the system can divide the work area into a number of equal work areas depending on the number of user symbols defined, and then automatically locate the user symbols at the positions where the users need to be in order to work in the areas of the work surface they have been assigned. For example, in an arrangement where two symbols have been associated with users, the system can divide the work surface into two parts (along the middle) and then place each symbol at opposite edges of the table in relation to the centre line. Each user then moves to a position at the edge of the table next to his or her symbol, the users identifying their symbols using the identifying information displayed on them. Within such an arrangement, the system thus determines the best working arrangement for the surface and the users, instead of the users themselves choosing their working positions.

Preferably, the work areas are defined by their proximity to parts of the edges of the interactive display area. Work areas can thus be defined so that they have a certain size, such as a size corresponding to a typical work area of a personal computer with a touch screen, a suitable work area being adjacent to or coinciding with a user's symbol.

Preferably, a plurality of displayed icons are provided, each associated with one of a plurality of users, the touch-screen display surface comprising at least one work area which is a subdivision of the whole area, whereby the detection of input within such a work area is associated with a user depending on the displayed icon, this user being located in, next to or near the work area.
Within another arrangement, a procedure is provided for cooperation at a touch-sensitive surface, which involves each of a plurality of users being associated with a group, and input from the groups being tracked.

The association or identifier provided for a symbol is not necessarily a user identifier. As mentioned above, the identifier may represent another association. An example of this is the identification of an association with a group. For example, when a user selects a symbol during a registration or initialisation process, the association options shown may include the presentation of one or more group associations. The user can then select a group association for the symbol, either in combination with or independently of some other association. Within this arrangement, one or more pre-designated groups may be provided, or one or more groups may have already been created by other users at the time a new symbol is selected. Thus, a list of one or more available groups is displayed to a user, on selection of a symbol, as available association information. The user can select one or more groups, and the symbol is then associated with this or these groups.

The association of the symbol with one or more groups can be indicated on the GUI icon for the symbol by showing a graphical representation of a group, a name of a group, or for example by displaying the symbol in a certain color, this color being associated with a group.

Thus, each symbol can be associated with a group, in addition to or independently of being associated with a user identity.

One or more groups can be defined by an application and thus displayed to a user, on selection of a symbol, as one or more default groups. The user can thus join one or more defined user groups.

Within a preferred arrangement, input is tracked in connection with symbols which are associated with a common group, in order to determine the contribution of individual users to the work of the group.
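Tracking the contribution of individual users to the work of a group, as in the preferred arrangement above, amounts to tallying input per group and per user within each group. The following is an illustrative sketch; the event representation is an assumption.

```python
from collections import defaultdict

def tally_contributions(events):
    """Illustrative sketch: `events` is an iterable of (group, user)
    pairs, one per input. Returns a nested mapping
    {group: {user: count}}, from which each individual user's
    contribution to each group's work can be read off."""
    tally = defaultdict(lambda: defaultdict(int))
    for group, user in events:
        tally[group][user] += 1
    return tally
```

An anonymous contribution could be recorded with a placeholder user id, since the patent allows group input either with or without user identification.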
A symbol can thus be associated with a group either with or without a user identification, so that input can be given to the group either anonymously or with identification of the user.

In another arrangement, agreement is required between a number of users. Within one procedure for cooperation, collective agreement is thus required. Collective agreement is provided in connection with multiple selections being made at the same user interface. Within an arrangement, a procedure is provided for cooperation at a touch-sensitive interactive display, which includes displaying a plurality of user interface elements, each associated with one of a plurality of users, wherein a step of an application running on a computer system connected to the touch-sensitive interactive display depends on a selection made by each user at each user interface element.

Such a step may depend on each user making the same choice. An example of this collective agreement arises within the initialisation or registration phase as described above. In an arrangement where the registration or initialisation phase includes a requirement that a user select a symbol by touching it and dragging it to the edge of the touch screen, indicating that the user wishes to select this symbol for an association, the initialisation or registration process may not end, so as to allow further use of applications, until all such selected symbols have been associated with a user identity. The choice made by each user may thus include a choice of a user identity, and only when all users have selected a user identity can the initialisation or registration process continue to the starting of an application.

An example might be when the application that is enabled is a multi-user game. A number of users select symbols to be able to participate in the game as individual players. The application does not allow the game to start until each user who has selected a symbol has provided an identity for it.
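The collective-agreement gate described above, in which an application may advance only when every selected symbol has been given an identity, reduces to a simple check. This sketch is illustrative; the mapping name and the use of `None` for a pending choice are assumptions.

```python
def may_proceed(symbol_identity):
    """Illustrative sketch of the collective-agreement gate:
    `symbol_identity` maps each selected symbol to its chosen user
    identity, or None while the choice is still pending. The
    application may advance only when every selected symbol has
    an identity defined."""
    return all(identity is not None
               for identity in symbol_identity.values())
```

The same gate works for other required associations, such as a group per symbol, by storing the group identifier instead of the user identity.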
Before each selected symbol has been given an identity, the game cannot start.

In other arrangements, the initialisation or registration process cannot be completed until another appropriate association has been provided for each symbol. It is thus not necessary for each user to provide an identity for the symbol. As mentioned above, each symbol may instead be required to be associated with a group. In alternative arrangements, it may thus be the case that an association with a group must be defined for each symbol before the application can proceed to the next stage.

As has been discussed above, arrangements are provided according to which the location of a user can be determined depending on the position of a symbol associated with the user. The user can move around the work surface and drag his or her associated symbol with him or her. The system can thus monitor a user's movement around the work surface and be adapted in such a way that input is associated with the user depending on the user's currently established place. An arrangement thus provides a method for tracking a user's position relative to an edge of a touch-sensitive interactive display surface, which includes providing for each user a user icon which appears on the display surface, whereby the current position of the displayed user icon represents the user's current position. The movement of the user icon, or symbol, of a user represents the movement of that user's position. Input detected in an area close to the user icon is associated with the user associated with the user icon.

A user icon associated with an identifier, such as a user identifier, can be dissociated or deselected. In an arrangement where a symbol is dragged to an edge of the display surface for association of the symbol, this dissociation or deselection is obtained simply by the user dragging the symbol away from the edge, towards the centre of the surface, and releasing the contact. After such an event, the dissociation or deselection takes place automatically.
Alternatively, the choice to dissociate or deselect is provided as a menu option on selection of the symbol. After the dissociation or deselection, the displayed symbol can be removed from the display or displayed within a special part of the work surface, where it is available for selection by another user.

With reference to Figures 9(a) to 9(e), an example of a registration or initialisation process that includes features described above is now illustrated.

As illustrated by Figure 9, a plurality of symbols, denoted by reference numerals 202a to 202f, are shown generally located on and around a ring 200. A circular icon 204 is displayed in the centre of the ring, and this gives, as will be described further in the following, an indication of the state of the user registration or initialisation process.

According to the arrangement in Figure 9(a), the symbols 202a to 202f have no associations defined with them, but they are available for selection. Those skilled in the art will appreciate that such symbols can appear on the display in any position.

The circular user icon 204 in the centre is displayed showing a neutral "face", which indicates that the user initialisation or registration has not yet been completed and that an application associated with the selection of symbols cannot continue.

As shown in Figure 9(b), two users each select one of the symbols 202a to 202f. A first user selects the symbol 202a and drags the symbol generally to the right edge (as shown in the figures) of the display surface. A second user selects the symbol 202b and drags the symbol to the lower edge (as shown in the figures) of the display.

As shown in Figure 9(b), when the symbol 202b has been dragged to the display edge, and preferably appropriately aligned and positioned relative to the edge, another icon 206b is displayed next to the symbol 202b, and a further icon set 212b is also displayed next to the symbol 202b.
The displayed icon 206b is a "traffic light icon", which has two "lights" that can be displayed on it, of which only one can be lit at a given time. A position 208b denotes a red light, and a position 210b denotes a green light. While the user's choice of an identity associated with the symbol 202b is in progress, the traffic light 206b shows the red light 208b. When the user is sure he or she has completed his or her registration and touches the displayed icon 206b, the displayed light changes from the red light 208b to the green light 210b, indicating that the user has completed his or her user registration. Likewise, the symbol 202a is associated with a traffic light 206a, which has a red light 208a and a green light 210a and is controlled in the same manner as the traffic light 206b.

As further illustrated in Figure 9(b), the set of displayed icons 212b includes a number of avatars. As shown, the plurality of avatars includes, for example, a panda, a frog, a cat and an owl. The user can browse through the available avatars by moving his or her finger from left to right in the icon set 212b, so that more avatars may be available for display than are shown in Figure 9, only a small number being shown at the same time to avoid taking up too much display space on the surface. The user can then select an avatar by touching it with his or her finger, whereupon the avatar becomes visible in the centre of his or her symbol 202b. As shown in Figure 9(b), the user has selected the frog avatar, so that an avatar representing the frog is displayed in the symbol 202b. In this way, the user can identify himself or herself anonymously, but in such a way that a unique identity is associated with him or her.

As also illustrated by Figure 9(b), the user associated with the symbol 202a also has a displayed set of user icons 212a, which as illustrated in Figure 9(b) includes presented photographs of individuals.
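The traffic-light behaviour of the icons 206a and 206b, together with the central "face" icon 204, can be sketched as a small state machine. This is an illustrative sketch under the assumption that touching the traffic light simply toggles between red and green, as Figures 9(c) and 9(d) suggest; the class and state names are not from the source.

```python
class TrafficLight:
    """Illustrative sketch of the per-user red/green registration
    indicators (icons 206a/206b) and the central face icon 204."""

    def __init__(self, users):
        # Every user starts in the red state: identity choice pending.
        self.state = {u: "red" for u in users}

    def touch(self, user):
        # Touching the traffic light toggles red <-> green, so a user
        # can reopen his or her selection as in Figure 9(d).
        self.state[user] = "green" if self.state[user] == "red" else "red"

    def face(self):
        # The central icon 204 turns positive only when every user
        # has confirmed (all lights green); otherwise it stays neutral.
        if all(s == "green" for s in self.state.values()):
            return "positive"
        return "neutral"
```

In the Figure 9 sequence, one green light alone leaves the face neutral; only when both users have touched their lights to green does the face become positive and the application proceed.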
The user can choose the photograph of the individual representing him or her, after which the appropriate photograph is displayed in the centre of the symbol 202a. The user can also scroll from left to right among the set of displayed icons 212a, and the photographs of users - who can be registered users of the system - can be displayed, as well as avatars and other possibilities for defining an association of the symbol.

As illustrated in Figure 9(b), each of the users has selected a displayed icon from their respective sets 212b and 212a, but the traffic lights 206b and 206a for each of the users remain red, as indicated by lights in positions 208b and 208a.

As shown in Figure 9(c), the first user completes the selection of his or her user icon by touching the traffic light icon 206b, so that the displayed light changes to the green light in position 210b. The options 212b are then no longer displayed, and the selected option is shown in the symbol 202b, being as shown an avatar of a frog. At the same time, the other user maintains the traffic light 206a in the red light position, as indicated by the light in position 208a.

It can be seen that during the whole process of Figures 9(b) and 9(c), the "face" of the icon 204 in the centre of the screen is maintained in a neutral position.

Referring to Figure 9(d), the first user again touches the icon 206b so that his or her status returns to incomplete, indicating that a user identity is again being selected. The traffic light thus again shows the red light in position 208b, and the selection icons 212b are displayed again. As shown in Figure 9(d), the symbol is then adjusted so that no user identification is displayed with it. For the other user, associated with the symbol 202a, the displayed icon set 212a has changed to display avatars, because the user has scrolled left or right to view further options.
The user's traffic light 206a is maintained so that it shows the red light in position 208a. The displayed icon 204 is maintained with the "face" set to neutral.

Referring to Figure 9(e), the case is then shown where the first user has selected a desired user identity, as indicated by the green traffic light at position 210b of the traffic light 206b. As shown in Figure 9(e), this is the choice of a frog avatar in the symbol 202b. Furthermore, the other user, associated with the symbol 202a, touches the traffic light 206a to change the displayed traffic light to the green light in position 210a.

Since both users have now indicated that they have completed the selection of a user identification, the image on the icon 204 changes to a positive image, indicating that all symbols have been associated with users and that the users have indicated that this selection has been completed. The initialisation or registration process as such has been completed, and one or more applications can be run.

Referring to Figures 10(a) to 10(d), an example of a step within the use of an application following the selection of user identities, as described with reference to Figures 9(a) to 9(e), is now illustrated.

Figures 10(a) to 10(d) show a drawing application running on the interactive display surface, to which users associated with the symbols 202b and 202a can provide input. A number of lines shown on the display surface are illustrated. As shown, each of the symbols 202a and 202b is associated with a correspondingly displayed tool menu 220a and 220b. As shown, different tools can be displayed in the menu, but only a subgroup of the available tools can be displayed simultaneously. The user can thus reveal additional tools that can be selected by scrolling from left to right in the menus 220a and 220b. As shown, the tools available may include, for example, a ruler and a protractor.
A user selects a tool by touching the icon displayed for the tool he or she wishes to use in his or her menu 220a or 220b respectively.

As illustrated in Figure 10(b), the tool menu 220b is no longer displayed because the user associated with the symbol 202b has selected a specific tool, in this case more specifically a protractor. Thus, as illustrated in Figure 10(b), a protractor 222 is shown on the display surface, and together with the protractor a small icon appears representing the user who has selected it, which in this example is a copy of the symbol with the user's avatar, denoted by the reference numeral 224. With the protractor 222 there is also shown an icon 226, which provides a means for the user to deselect the tool. As illustrated in Figure 10(b), the other user, associated with the symbol 202a, has not selected any tool, and that user's tool menu 220a is therefore still displayed.

As illustrated in Figure 10(c), the user of the symbol 202a has now selected a tool, and therefore the tool menu 220a is no longer displayed. This user has also selected a protractor, which is denoted by the reference numeral 230. The protractor 230 has a copy of the symbol 202a, displayed as the icon 232, and an icon 234 with which the protractor can be deselected.

As also shown in Figure 10(c), the first user, associated with the symbol 202b, has now also selected a keyboard 240, and the keyboard is displayed in the same way with an icon 242, which is a copy of the symbol 202b, and an icon 244 with which the keyboard can be deselected.

In accordance with the principles described above, all input detected as associated with the protractor 222 or the keyboard 240 is associated with the user associated with the user icon 202b. All input detected as associated with the protractor 230 is associated with the user associated with the symbol 202a.

Figure 10(d) illustrates an icon 246 which displays a number (the number 140). This represents the result of a calculation performed using the keyboard 240.
The keyboard 240 can simply be a calculator. The displayed answer, which is denoted by the reference numeral 246, can be dragged into place to become a marking at a calculated angle. The application can determine that the answer has been provided by the first user, associated with the symbol 202b, since it has been calculated using the keyboard 240.

Referring to Figures 11(a) to 11(d), another example is illustrated in which the application running is a game. As illustrated in Figure 11(a), the user who is associated with the symbol 202b is presented with a game menu 250. As before, further games can be displayed by scrolling left and right in the game menu 250. The user who is associated with the symbol 202b selects a chess game, and the chessboard of the game is displayed as a displayed icon, denoted by the reference numeral 252. As shown in Figure 11(a), the icon associated with the symbol 202b also appears on the chessboard, as indicated by the reference numeral 254. This is preferably a copy of the symbol 202b.

As illustrated in Figure 11(b), the user associated with the symbol 202a drags his or her symbol 202a to the chessboard to indicate that he or she wants to play chess. As illustrated in Figure 11(c), the symbol 202a is then returned to a "parked" position at one edge of the display surface. A copy of the symbol 202a is then displayed on the board, as indicated by the reference numeral 256. The application now "knows" the identity of the two players who are to play chess.

As illustrated in Figure 11(d), additional users may perform additional activities while the chess game is in progress between the two users associated with the symbols 202a and 202b. As illustrated in Figure 11(d), a user associated with a symbol 202c is playing a game identified by the reference numeral 258. "Ownership" of the game is indicated by a copy of the symbol 202c, denoted by the reference numeral 260, appearing on the game.
Likewise, a user associated with the symbol 202d is associated with a clock 262, and the association is confirmed by a copy of the symbol 202d being displayed as image 264 on the clock. Thus, all input associated with the displayed game 258 is associated with the user connected to the symbol 202c, and all input associated with the clock 262 is associated with the symbol 202d and the user associated with it.

Referring to Figures 12(a) to 12(d), another example is now described of an application in which multiple users with symbols edit images. As illustrated in Figure 12(a), there are four symbols, denoted by the reference numerals 270a to 270d, each of which has been associated with a user. In the example of Figure 12(a), each user has selected a user identity using an avatar.

For the symbol 270a, a menu 272a is also displayed with various options connected with image editing. Here too, the user can display additional options by scrolling the menu left and right. Figure 12(a) also illustrates a number of images, denoted by the reference numerals 274a to 274e, generally displayed on the display surface in such a way that they overlap.

All users have tool options provided in a menu, such as the menu 272a for the symbol 270a, including a workspace tool. The workspace tool is denoted by the reference numeral 278 in the menu 272a. Figure 12(b) illustrates that a workspace tool has been selected for each of the symbols 270a and 270b, the workspaces being shown on the interactive surface and denoted by the reference numerals 280a and 280b. As in the previous examples, the selected workspaces are given an association with the respective identified symbol by a copy of the symbols 270a and 270b being provided on the workspaces 280a and 280b respectively, denoted by the reference numerals 282a and 282b.
As illustrated in Figure 12(c), a user uses a touch input to drag one of the displayed images, denoted by the reference numeral 274d, over his workspace. The displayed image is then associated with the workspace to which it has been dragged. In the example shown in Figure 12, the image 274d is dragged over the workspace 280a and is thus associated with the symbol 270a. Figure 12(d) shows that the image 274d is now displayed within the workspace 280a as belonging to the user associated with the symbol 270a.

As also illustrated in Figure 12(d), workspaces 280c and 280d are now shown, associated with the symbols 270c and 270d, and this association is indicated by a copy of the symbol being displayed within the workspace, as indicated by the reference numerals 282c and 282d. Any user can choose a suitable image by touching it and dragging it over his workspace area, so that the image is then displayed within this workspace for further editing. In the example of Figure 12(d), the image 274a is thus dragged into the workspace area 280d, the image 274c is dragged into the workspace area 280c, and the image 274e is dragged into the workspace area 280b.

When an image has been dragged into a workspace area, all editing of that image is performed within the workspace area. When the image editing is completed, the user can select the image and drag it to an area of the display surface outside the workspace, whereupon the association between the image and the corresponding symbol ceases.

All examples and embodiments described here can be combined in different ways and are not mutually exclusive. The invention has been described herein by reference to particular examples and exemplary embodiments. Those skilled in the art will appreciate that the invention is not limited to the details of the specific examples and exemplary embodiments described. Many other embodiments are conceivable without departing from the scope of the invention, which is defined by the appended claims.
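The workspace behaviour of Figure 12 reduces to a hit test: an image dropped inside a workspace rectangle becomes associated with that workspace's owner, and dragging it outside every workspace ends the association. The sketch below, with assumed names (`Rect`, `Workspace`, `drop_image`), illustrates one way this could work:

```python
from dataclasses import dataclass, field


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class Workspace:
    owner_symbol: str   # e.g. "270a", stamped on the workspace as icon 282a
    bounds: Rect
    images: set = field(default_factory=set)


def drop_image(workspaces, image_id, px, py):
    """Associate a dragged image with whichever workspace the drop point
    lands in; a drop outside every workspace dissolves any association."""
    for ws in workspaces:
        ws.images.discard(image_id)   # leaving a workspace ends the link
    for ws in workspaces:
        if ws.bounds.contains(px, py):
            ws.images.add(image_id)
            return ws.owner_symbol
    return None


ws_280a = Workspace("270a", Rect(0, 0, 100, 100))
ws_280b = Workspace("270b", Rect(200, 0, 100, 100))

assert drop_image([ws_280a, ws_280b], "274d", 50, 50) == "270a"
assert drop_image([ws_280a, ws_280b], "274d", 150, 50) is None  # dragged outside
assert "274d" not in ws_280a.images
```

The second drop models the end of the Figure 12 example: once the edited image is dragged outside the workspace, no user symbol owns it any longer.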
Claims (147) [1] A method of cooperating at a touch-sensitive surface in a cooperating input system, the method comprising: a. selecting a displayed icon by touching the touch-sensitive surface; and b. providing additional input by touching the touch-sensitive surface, this additional input being associated with the selected icon. [2] The method of claim 1, wherein said additional input is associated with the selected icon until another icon is selected. [3] The method of claim 1, wherein said additional input is associated with the selected icon until the selected icon is deselected. [4] The method of claim 1, further comprising the step, after selecting the displayed icon, of associating the displayed icon with a user. [5] The method of claim 4, further comprising a step of a user registration process upon selecting the displayed icon for identifying the user. [6] A method according to any one of claims 1 to 5, wherein the step of selecting a displayed icon comprises selecting the icon and dragging the icon to an edge of the touch-sensitive surface. [7] The method of claim 6, wherein an icon is selected in response to detecting that the icon is dragged to the edge of the touch-sensitive surface. [8] A method according to any one of the preceding claims, wherein a selected icon is, after the selection, aligned relative to an edge of the touch-sensitive surface. [9] A method according to claim 8 when dependent on claim 6 or 7, wherein the selected icon is aligned relative to the edge to which it is dragged. [10] A method according to any one of the preceding claims, wherein the icon is, upon selection, placed at a predetermined distance from an edge of the touch-sensitive surface. [11] A method according to any one of claims 4 to 10, wherein after the selection of the displayed icon a plurality of identification options are provided for selection. [12] A method according to claim 11, wherein the options comprise a plurality of avatars or a plurality of images.
[13] A method according to claim 11 or 12, wherein the selection of a further displayed icon confirms a user's choice among the options. [14] A method according to any one of claims 11 to 13, wherein, when a plurality of icons are selected by a plurality of users, a user selection process is maintained until all users have confirmed the selection of all options for each of the plurality of selected icons. [15] A method according to any one of the preceding claims, further comprising repositioning the displayed and selected icon on the touch-sensitive surface. [16] A method according to any one of the preceding claims, wherein a plurality of users select a plurality of displayed icons. [17] The method of claim 16, wherein two or more of the plurality of selected icons are grouped. [18] The method of claim 17, wherein the grouping is denoted by an identifier displayed on the grouped user icons. [19] The method of claim 17 or 18, wherein the grouping of the user icons is determined by a user selecting a group identifier for their displayed icon. [20] A method according to any one of claims 1 to 19, wherein a selected user icon is deselected by dragging the displayed icon away from the edge of the touch-sensitive surface. [21] The method of any one of claims 1 to 20, further comprising the step of defining at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more users. [22] The method of any one of claims 1 to 20, further comprising the step of defining at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more applications. [23] The method of any one of claims 1 to 20, further comprising the step of defining a plurality of active areas on the touch-sensitive surface, the active areas being defined by selecting user icons. [24] A method according to any one of the preceding claims, wherein each selected icon is associated with a menu.
[25] The method of claim 24, wherein the menu is displayed when the icon is selected. [26] The method of any one of claims 1 to 25, further comprising the step of determining a position of a user associated with the displayed icon. [27] The method of claim 26, wherein the position of the user is determined by the position of the displayed icon. [28] A method according to any one of claims 1 to 27, further comprising providing a plurality of displayed icons, each displayed icon being intended for association with a particular identity. [29] The method of claim 28, wherein the identity is a user identity. [30] The method of any one of claims 1 to 29, further comprising selecting an option displayed after a selection of a displayed icon, and associating the selected option with that icon. [31] The method of claim 30, wherein the option is a tool selected from a tool menu, the tool being displayed on the touch-sensitive surface, all inputs derived from using the tool being associated with the displayed icon. [32] A method according to any one of claims 1 to 31, wherein the displayed icon represents a database. [33] A method of cooperating on a touch-sensitive surface, comprising determining the positions of a number of users relative to the surface, and allocating inputs to one or more users depending on the determined positions, based on the location of each input relative to the determined positions. [34] The method of claim 33, further comprising tracking the contribution of each user. [35] The method of claim 34, further comprising tracking the contribution of each user to a collaboration task. [36] A method according to any one of claims 33 to 35, wherein the position of each user is determined depending on the position of a displayed icon associated with the user. [37] The method of claim 36, wherein the displayed icon is placed by the user.
[38] The method of claim 37, wherein the displayed icon is placed along an edge of the touch-sensitive surface. [39] A method of cooperating on a touch-sensitive surface according to any one of claims 1 to 32, comprising: a. locating the position of a plurality of users in relation to the touch-sensitive surface depending on the location of a corresponding plurality of displayed icons associated with the plurality of users; and b. associating input at the touch-sensitive surface with a specific user depending on the location of that input near the user's located position. [40] The method of claim 39, wherein an input is considered to be close to a located position of a user if it is within a certain area around the user's located position. [41] The method of claim 39 or 40, wherein the step of locating a position for a plurality of users comprises determining the position of the users relative to the edges of the touch-sensitive surface. [42] The method of any one of claims 39 to 41, further comprising the step of determining the number of users. [43] The method of any one of claims 39 to 42, further comprising the step of tracking input from each user. [44] The method of claim 43, further comprising providing information about the contribution provided by each user. [45] The method of any one of claims 39 to 44, wherein a system event is determined by a collective response from all users. [46] A method of cooperating on a touch-sensitive surface, comprising: a. defining an area associated with a user; and b. associating any information displayed in this area with this user. [47] The method of claim 46, wherein the area is defined by a physical area of the touch-sensitive surface. [48] The method of claim 46, wherein the area is defined by a graphic icon displayed on the touch-sensitive surface. [49] A method according to any one of claims 46 to 48, wherein the step of associating comprises selecting displayed content and dragging it into the defined area.
[50] A method according to any one of claims 1 to 45, further comprising associating with the displayed icon any function which is located in the same place as the icon. [51] The method of claim 50, wherein a function is associated with the displayed icon if another displayed icon representing that function is positioned to coincide with the displayed icon. [52] A method of cooperating on a touch-sensitive surface, comprising: a. associating each of a number of users with a group; and b. tracking input from the group. [53] The method of claim 52, wherein the step of associating each of a plurality of users with a group comprises each user joining a group. [54] The method of claim 52 or 53, further comprising providing a plurality of user groups. [55] The method of any one of claims 52 to 54, wherein the group is defined by an application and the association of a user with a group comprises the user joining the defined group. [56] The method of any one of claims 1 to 51, further comprising providing a plurality of displayed icons, each associated with one of a plurality of users, wherein two or more displayed icons are associated with a grouping. [57] The method of claim 56, wherein a common grouping of two or more displayed icons is denoted by a common element of the displayed icons. [58] The method of claim 56 or 57, wherein one or more predetermined groups are provided, and further wherein an association of a displayed icon with a grouping comprises associating the displayed icon with a predetermined group. [59] A method according to any one of claims 56 to 58, wherein input associated with a displayed icon associated with a grouping is tracked to determine an individual's contribution to a group.
[60] A method of cooperating on a touch-sensitive interactive display surface, comprising displaying a plurality of user interface elements, each associated with one of a plurality of users, wherein a step of an application running on a computer system coupled to the touch-sensitive interactive surface is dependent on a selection made by each user at each user interface element. [61] The method of claim 60, wherein the step of the application running on a computer system coupled to the touch-sensitive interactive display is dependent on the choice made by each user at each user interface element being the same choice. [62] The method of claim 61, wherein the selection made by each user comprises a selection of a user identity. [63] A method according to any one of claims 1 to 59, further comprising providing a plurality of displayed icons, each for association with one of a plurality of users, wherein a step of an application running on a computer system coupled to the touch-sensitive surface depends on the input associated with each of the plurality of icons. [64] The method of claim 63, wherein the step of the application is dependent on each of the plurality of displayed icons being associated with a user. [65] A method of tracking the position of a user relative to the edge of a surface of a touch-sensitive interactive display surface, comprising providing for each user a user icon displayed on the display surface, the current position of the displayed user icon representing the current position of the user. [66] The method of claim 65, wherein movement of the user icon of a user represents movement of that user's position. [67] The method of claim 65 or 66, wherein input detected within an area in the vicinity of the user icon is associated with the user associated with the user icon.
[68] A method according to any one of claims 1 to 65, wherein the displayed icon is associated with a user, wherein the current position of the displayed icon represents the current position of the user in relation to the touch-sensitive surface. [69] The method of claim 68, wherein the position of the user is determined depending on the displayed icon being located at or near an edge of the touch-sensitive surface. [70] A method of collaborating in a system comprising an interactive display surface with a touch screen, comprising: providing a plurality of user icons representing a corresponding plurality of users; and dividing the interactive display surface with touch screen into a number of work areas, each work area being arranged to receive input from one or more specified users, the users being specified by the location of their associated user icon with respect to the work area. [71] The method of claim 70, wherein the work areas are defined by their proximity to portions of the edges of the interactive display surface. [72] A method according to any one of claims 1 to 69, further comprising providing a plurality of displayed icons, each associated with one of a plurality of users, the touch screen display area comprising at least one work area which is a subdivision of the entire area, wherein input detected within such a work area is associated with a user depending on whether the displayed icon associated with that user is within the work area. [73] The method of any one of claims 1 to 72, further comprising selecting a tool or object associated with the displayed icon, wherein any interaction with the tool or object is associated with the displayed icon. [74] The method of claim 73, wherein the displayed icon is associated with a user and the interaction with the tool or object is associated with the user.
[75] A method according to claim 73 or 74, wherein the tool or object is selected by an association with or a selection of the displayed icon. [76] A collaborative input system, comprising a touch-sensitive surface for receiving a plurality of touch inputs, adapted to: determine selection of a displayed icon by touch; detect additional input by touch; and associate this additional input with the selected icon. [77] The collaborative input system of claim 76, further adapted to associate additional input with the selected icon until another icon is selected. [78] The collaborative input system of claim 76, further adapted to associate additional input with the selected icon until the selected icon is deselected. [79] A collaborative input system according to claim 76, further adapted so that, when the displayed icon is selected, the displayed icon is associated with a user. [80] The collaborative input system of claim 79, further adapted to provide a user registration process upon selection of the displayed icon for identifying the user. [81] A collaborative input system according to any one of claims 76 to 80, further adapted so that the selection of a displayed icon comprises selecting the icon and dragging the icon to an edge of the touch-sensitive surface. [82] The collaborative input system of claim 81, further adapted so that, when it is detected that an icon is dragged to the edge of the touch-sensitive surface, the icon is selected. [83] A collaborative input system according to any one of claims 76 to 82, further adapted so that, upon selection, a selected icon is aligned relative to an edge of the touch-sensitive surface. [84] The collaborative input system of claim 83, when dependent on claim 81 or 82, wherein the selected icon is aligned with the edge to which it is dragged.
[85] A collaborative input system according to any one of claims 76 to 84, further adapted so that, upon selection, the icon is placed at a predetermined distance from an edge of the touch-sensitive surface. [86] A collaborative input system according to any one of claims 79 to 85, further adapted to provide a plurality of identification options for selection upon selection of the displayed icon. [87] A collaborative input system according to claim 86, further adapted so that the options comprise a plurality of avatars or a plurality of images. [88] A collaborative input system according to claim 86 or 87, further adapted so that the selection of an additional displayed icon confirms the user's choice among the options. [89] A collaborative input system according to any one of claims 86 to 88, further adapted so that, when a plurality of icons are selected by a number of users, a user selection process is maintained until all users have confirmed the selection of all options for each of the plurality of selected icons. [90] A collaborative input system according to any one of claims 76 to 89, further adapted so that the displayed and selected icon can be repositioned on the touch-sensitive surface. [91] A collaborative input system according to any one of claims 76 to 90, further adapted so that a plurality of users can select a plurality of displayed icons. [92] The collaborative input system of claim 91, further adapted to group two or more of the plurality of selected icons. [93] The collaborative input system of claim 92, further adapted so that the grouping is denoted by an identifier displayed on the grouped user icons. [94] A collaborative input system according to claim 92 or 93, further adapted so that the grouping of the user icons is determined by a user selecting a group identifier for his or her displayed icon.
[95] A collaborative input system according to any one of claims 76 to 94, further adapted so that a selected user icon is deselected by dragging the displayed icon away from the edge of the touch-sensitive surface. [96] A collaborative input system according to any one of claims 76 to 95, further adapted to define at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more users. [97] A collaborative input system according to any one of claims 76 to 95, further adapted to define at least one active area on the touch-sensitive surface, the active area being reserved for use by one or more applications. [98] A collaborative input system according to any one of claims 76 to 95, further adapted to define a number of active areas on the touch-sensitive surface, the active areas being defined by selecting user icons. [99] A collaborative input system according to any one of claims 76 to 98, further adapted so that each selected icon is associated with a menu. [100] A collaborative input system according to claim 99, further adapted so that the menu is displayed upon selection of the icon. [101] A collaborative input system according to any one of claims 76 to 100, further adapted to determine a position of a user associated with the displayed icon. [102] A collaborative input system according to claim 101, further adapted so that the position of the user is determined by means of the position of the displayed icon. [103] A collaborative input system according to any one of claims 76 to 102, further adapted to provide a plurality of displayed icons, each displayed icon being intended to be associated with a particular identity. [104] The collaborative input system of claim 103, further adapted so that the identity is a user identity.
[105] A collaborative input system according to any one of claims 76 to 104, further adapted to select an option displayed after a selection of the displayed icon and to associate the selected option with that icon. [106] A collaborative input system according to claim 105, further adapted so that the option is a tool selected from a tool menu, wherein the tool is displayed on the touch-sensitive surface, and wherein all input derived from the use of the tool is associated with the displayed icon. [107] A collaborative input system according to any one of claims 76 to 106, further adapted so that the displayed icon represents a database. [108] A collaborative input system comprising a touch-sensitive surface, adapted to determine the positions of a plurality of users relative to the surface, and to allocate inputs to one or more users depending on the determined positions, based on the location of each input relative to the determined positions. [109] The collaborative input system of claim 108, further adapted to track the contribution of each user. [110] The collaborative input system of claim 109, further adapted to track the contribution of each user to a collaborative task. [111] A collaborative input system according to any one of claims 108 to 110, further adapted so that the position of each user is determined depending on the position of a displayed icon associated with the user. [112] A collaborative input system according to claim 111, further adapted so that the displayed icon is placed by the user. [113] A collaborative input system according to claim 112, further adapted so that the displayed icon is placed along an edge of the touch-sensitive surface. [114] A collaborative input system comprising a touch-sensitive surface according to any one of claims 76 to 107, further adapted to a.
locate the position of a plurality of users relative to the touch-sensitive surface depending on the location of a corresponding plurality of displayed icons associated with the plurality of users; and b. associate input at the touch-sensitive surface with a specific user depending on the location of that input being close to the user's located position. [115] A collaborative input system according to claim 114, wherein the input is determined to be near a located position of a user if it is located within a certain area around the user's located position. [116] The collaborative input system of claim 114 or 115, wherein locating the position of a plurality of users comprises determining the position of the users relative to the edges of the touch-sensitive surface. [117] A collaborative input system according to any one of claims 114 to 116, further adapted to determine the number of users. [118] A collaborative input system according to any one of claims 114 to 117, further adapted to track input from each user. [119] The collaborative input system of claim 118, further adapted to provide information about the contribution provided by each user. [120] A collaborative input system according to any one of claims 114 to 119, wherein a system event is determined by a collective response from all users. [121] A collaborative input system comprising a touch-sensitive surface, adapted to: a. define an area associated with a user; and b. associate any information displayed in this area with this user. [122] A collaborative input system according to claim 121, wherein the area is defined by a physical area on the touch-sensitive surface. [123] The collaborative input system of claim 121, wherein the area is defined by a graphic icon displayed on the touch-sensitive surface. [124] A collaborative input system according to any one of claims 121 to 123, wherein the association comprises selecting displayed content and dragging it into the defined area.
[125] A collaborative input system according to any one of claims 76 to 120, further adapted to associate with the displayed icon any function located in the same place as the icon. [126] The collaborative input system of claim 125, wherein a function is associated with the displayed icon if another displayed icon representing this function is positioned to coincide with the displayed icon. [127] A collaborative input system comprising a touch-sensitive surface, adapted to: a. associate each of a plurality of users with a group; and b. track input from the group. [128] The collaborative input system of claim 127, wherein the association of each of a plurality of users with a group comprises each user joining a group. [129] A collaborative input system according to claim 127 or 128, further adapted to provide a plurality of user groups. [130] The collaborative input system of any one of claims 127 to 129, wherein the group is defined by an application and the association of a user with a group comprises the user joining the defined group. [131] A collaborative input system according to any one of claims 76 to 126, further adapted to provide a plurality of displayed icons, each associated with one of a plurality of users, two or more displayed icons being associated with a grouping. [132] A collaborative input system according to claim 131, wherein a common grouping of two or more displayed icons is denoted by a common element of the displayed icons. [133] The collaborative input system of claim 131 or 132, wherein one or more predetermined groups are provided, and wherein associating a displayed icon with a grouping comprises associating the displayed icon with a predetermined group. [134] A collaborative input system according to any one of claims 131 to 133, wherein input associated with a displayed icon associated with a grouping is tracked to determine an individual's contribution to a group. [135]
A collaborative input system comprising a touch-sensitive interactive display surface, adapted to display a plurality of user interface elements, each associated with one of a plurality of users, wherein a step of an application running in a computer system connected to the touch-sensitive interactive display is dependent on a selection made by each user at the respective user interface element. [136] The collaborative input system of claim 135, wherein the step of the application running in a computer system coupled to the touch-sensitive interactive display is dependent on the choice made by each user at the respective user interface element being the same choice. [137] The collaborative input system of claim 136, wherein the selection made by each user comprises a selection of a user identity. [138] A collaborative input system according to any one of claims 76 to 134, further adapted to provide a plurality of displayed icons, each for association with one of a plurality of users, wherein a step of an application running in a computer system connected to the touch-sensitive surface depends on an input associated with each of the plurality of icons. [139] The collaborative input system of claim 138, wherein the step of the application is dependent on each of the plurality of displayed icons being associated with a user. [140] A collaborative input system adapted to track a user's position relative to the edge of a surface of a touch-sensitive interactive display surface by providing for each user a user icon, which is displayed on the display surface, the current position of the displayed user icon representing the user's current position. [141] The collaborative input system of claim 140, wherein movement of the user icon of a user represents movement of that user's position. [142] A collaborative input system according to claim 140 or 141, wherein input detected within an area close to the user icon is associated with the user associated with the user icon.
[143] A collaborative input system according to any one of claims 76 to 140, wherein the displayed icon is associated with a user, a current position of the displayed icon representing the user's current position relative to the touch-sensitive surface. [144] A collaborative input system according to claim 143, wherein the position of the user is determined depending on the displayed icon being located at or near an edge of the touch-sensitive surface. [145] A collaborative input system comprising an interactive touch screen display surface, adapted to provide a plurality of user icons representing a corresponding plurality of users, and to divide the interactive display surface with touch screen into a plurality of work areas, each work area being adapted to receive input from one or more specified users, the users being specified using the location of their associated user icon with respect to the work area. [146] A collaborative input system according to claim 145, wherein the work areas are defined by their proximity to parts of the edges of the interactive display surface. [147] A collaborative input system according to any one of claims 76 to 145, further adapted to display a plurality of icons, each associated with one of a plurality of users, the touch screen display area comprising at least one work area which is a subdivision of the whole area, wherein input detected in such a work area is associated with a user depending on the displayed icon associated with that user being within the work area.
Patent family:

Publication number | Publication date
GB201017505D0 | 2010-12-01
WO2012049199A1 | 2012-04-19
GB2484551A | 2012-04-18
US20140149889A1 | 2014-05-29
Legal status:

2015-05-26 | NAV | Patent application has lapsed
Priority:
Application number | Filing date | Patent title
GB201017505A | 2010-10-15 | Input association for touch sensitive surface